feat: add MiniMax as first-class LLM provider#300
Open
octo-patch wants to merge 1 commit into algorithmicsuperintelligence:main
Conversation
Add MiniMax AI (https://www.minimax.io/) as a directly supported LLM provider alongside OpenAI, Cerebras, and Azure OpenAI. MiniMax's API is OpenAI-compatible, so this uses the OpenAI SDK with MiniMax's base URL for seamless integration.

Changes:
- Add MINIMAX_API_KEY detection in get_config() with auto base URL
- Add temperature clamping for MiniMax: values are clamped to (0, 1]
- Update README provider table with MiniMax documentation
- Add 17 unit tests covering provider detection, priority, and temp clamping
- Add 3 integration tests for live API verification
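The detection order described above can be sketched roughly as follows. This is an illustrative outline, not optillm's actual `get_config()` code; the function name, return shape, and `MINIMAX_BASE_URL` constant are assumptions.

```python
import os

# Illustrative sketch of env-var based provider detection; names are
# hypothetical, not copied from optillm/server.py.
MINIMAX_BASE_URL = "https://api.minimax.io/v1"

def detect_provider(env=os.environ):
    """Return (provider, base_url) for the first API key found, in priority order."""
    if env.get("OPTILLM_API_KEY"):
        return "optillm", None          # local inference
    if env.get("CEREBRAS_API_KEY"):
        return "cerebras", None         # Cerebras default endpoint
    if env.get("MINIMAX_API_KEY"):
        # MiniMax is OpenAI-compatible, so the OpenAI SDK is reused
        # with this base URL.
        return "minimax", MINIMAX_BASE_URL
    if env.get("OPENAI_API_KEY"):
        return "openai", None
    if env.get("AZURE_OPENAI_API_KEY"):
        return "azure", None
    return None, None
```

Because the checks are ordered, a Cerebras key set alongside a MiniMax key still selects Cerebras, matching the priority order listed below in the summary.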
Summary
This PR adds MiniMax as a first-class provider: optillm detects the MINIMAX_API_KEY env var and uses MiniMax's OpenAI-compatible API at https://api.minimax.io/v1.

Changes

Provider Detection (optillm/server.py)
- Add a MINIMAX_API_KEY check in get_config(), between Cerebras and OpenAI in the priority order
- Use the MiniMax endpoint (https://api.minimax.io/v1) by default
- Allow a base_url override via the --base-url flag

Temperature Clamping (optillm/server.py)
- Clamp temperature to (0, 1] only when MINIMAX_API_KEY is set; other providers are unaffected

Documentation (README.md)
- Update the provider table with MiniMax

Tests
- Unit tests (tests/test_minimax_provider.py): provider detection, base URL, SSL, priority ordering, temperature clamping
- Integration tests (tests/test_minimax_integration.py): basic completion, temperature boundary, streaming (requires a live API key)

Provider Priority Order
1. OPTILLM_API_KEY - Local inference
2. CEREBRAS_API_KEY - Cerebras
3. MINIMAX_API_KEY - MiniMax (new)
4. OPENAI_API_KEY - OpenAI
5. AZURE_OPENAI_API_KEY - Azure OpenAI

Usage
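Since MiniMax exposes the standard OpenAI chat format, a request against its endpoint looks like any other OpenAI-compatible call. The sketch below only builds the request (nothing is sent); the model name "MiniMax-Text-01" and the payload shape are assumptions following the generic OpenAI chat format, not taken from this PR.

```python
import json
import os

# Base URL added by this PR; used whenever MINIMAX_API_KEY is set.
MINIMAX_BASE_URL = "https://api.minimax.io/v1"

def build_chat_request(prompt, temperature=0.7):
    """Build (url, headers, body) for a MiniMax chat completion; nothing is sent."""
    url = f"{MINIMAX_BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {os.environ.get('MINIMAX_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": "MiniMax-Text-01",          # illustrative model name
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,          # clamped to (0, 1] by this PR
    })
    return url, headers, body
```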
Test plan
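The clamping behavior the unit tests exercise can be sketched as below. The PR only states that MiniMax temperatures are clamped to (0, 1]; the function name and the epsilon used for the open lower bound are assumptions, not the PR's actual code.

```python
def clamp_minimax_temperature(temperature, eps=1e-6):
    # MiniMax accepts temperature in (0, 1]: the interval is open at 0,
    # so non-positive values are raised to a small epsilon (illustrative),
    # and values above 1 are capped at 1.
    if temperature <= 0:
        return eps
    return min(temperature, 1.0)

# Boundary checks mirroring the temperature tests described in the summary
assert clamp_minimax_temperature(0.0) > 0      # open bound at 0
assert clamp_minimax_temperature(0.7) == 0.7   # in-range values pass through
assert clamp_minimax_temperature(2.0) == 1.0   # capped at the closed bound
```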